
    Algorithmic States of Exception

    In this paper I argue that pervasive tracking and data-mining are leading to shifts in governmentality that can be characterised as algorithmic states of exception. I also argue that the apparatus that performs this change owes as much to everyday business models as it does to mass surveillance. I look at technical changes at the level of data structures, such as the move to NoSQL databases, and how this combines with data-mining and machine learning to accelerate the use of prediction as a form of governance. The consequent confusion between correlation and causation leads, I assert, to the creation of states of exception. I set out what I mean by states of exception using the ideas of Giorgio Agamben, focusing on the aspects most relevant to algorithmic regulation: force-of and topology. I argue that the effects of these states of exception escape legal constraints such as concepts of privacy. Having characterised this as a potentially totalising change and an erosion of civil liberties, I ask in what ways the states of exception might be opposed. I follow Agamben by drawing on Walter Benjamin's concept of pure means as a tactic that is itself outside the frame of law-producing or law-preserving activity. However, the urgent need to respond requires more than a philosophical stance, and I examine two examples of historical resistance that satisfy Benjamin's criteria. For each in turn I draw connections to contemporary cases of digital dissent that exhibit some of the same characteristics. I conclude that it is possible both theoretically and practically to resist the coming states of exception, and I end by warning what is at stake if we do not.

    A Qualitative Evaluation of IoT-driven eHealth: Knowledge Management, Business Models and Opportunities, Deployment and Evolution

    eHealth has major potential, and its adoption may be considered necessary to achieve increased ambulant and remote medical care, increased quality, reduced personnel needs, and reduced costs in healthcare. In this paper the authors give a reasoned, qualitative evaluation of IoT-driven eHealth from theoretical and practical viewpoints. They look at associated knowledge management issues and contributions of IoT to eHealth, along with requirements, benefits, limitations and entry barriers. Particular attention is given to security and privacy issues. Finally, the conditions for business plans and accompanying value chains are realistically analyzed. The resulting implementation issues and required commitments are also discussed based on a case study analysis. The authors confirm that IoT-driven eHealth can and will happen; however, much more needs to be addressed to bring it into sync with medical and general technological developments from an industrial state-of-the-art perspective, and to realize its benefits in a recognized and timely manner.

    Bioinformatics and the politics of innovation in the life sciences: Science and the state in the United Kingdom, China, and India

    The governments of China, India, and the United Kingdom are unanimous in their belief that bioinformatics should supply the link between basic life sciences research and its translation into health benefits for the population and the economy. Yet at the same time, as ambitious states vying for position in the future global bioeconomy, they differ considerably in the strategies adopted in pursuit of this goal. At the heart of these differences lies the interaction between epistemic change within the scientific community itself and the apparatus of the state. Drawing on desk-based research and thirty-two interviews with scientists and policy makers in the three countries, this article analyzes the politics that shape this interaction. From this analysis emerges an understanding of the variable capacities of different kinds of states and political systems to work with science in harnessing the potential of new epistemic territories in global life sciences innovation.

    Separation versus affiliation with partial vertical ownership in network industries

    The separation of integrated monopolies and the entry of new firms have changed vertical interactions between suppliers and dealers. Firms have substituted full integration with vertical restraints, leading to collusive behaviour harmful to competition. We examine how a partial vertical ownership (an affiliation) of one of the competing downstream retailers by the upstream monopoly could help internalise the production decision after a complete divestiture. Our results in a Cournot framework confirm that partial integration raises firms' profits and consumer surplus, thereby increasing social welfare. These results are consistent with empirical studies of economies after vertical separation in network industries.
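The double-marginalisation logic behind this abstract can be illustrated with a deliberately simple toy model (an assumption for illustration, not the paper's actual specification): an upstream monopoly with marginal cost c sells at wholesale price w to two symmetric downstream Cournot retailers facing linear inverse demand P = a - Q, and holds a passive stake alpha in retailer 1, so it internalises a share of that retailer's profit when choosing w.

```python
def equilibrium(a=10.0, c=2.0, alpha=0.0):
    """Equilibrium of a toy two-retailer Cournot game with an upstream
    monopoly holding a passive stake `alpha` in retailer 1.

    Downstream Cournot with symmetric input cost w gives q_i = (a - w) / 3.
    The upstream firm maximises (w - c) * Q + alpha * pi_1, which yields the
    closed-form wholesale price w* = ((3 - alpha) * a + 3 * c) / (6 - alpha).
    """
    w = ((3 - alpha) * a + 3 * c) / (6 - alpha)  # optimal wholesale price
    q = (a - w) / 3       # each retailer's Cournot output
    Q = 2 * q             # total output
    P = a - Q             # retail price
    cs = Q ** 2 / 2       # consumer surplus under linear demand P = a - Q
    return {"w": w, "Q": Q, "P": P, "CS": cs}

base = equilibrium(alpha=0.0)      # complete divestiture, no stake
partial = equilibrium(alpha=0.5)   # upstream holds 50% of retailer 1
print(base)
print(partial)
```

In this sketch a positive stake lowers the wholesale price and raises output and consumer surplus, matching the direction of the abstract's welfare claim; the real model in the paper may differ in demand, timing, or the retailer's objective.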

    Artificial Intelligence in Education

    Artificial Intelligence (AI) technologies have been researched in educational contexts for more than 30 years (Woolf 1988; Cumming and McDougall 2000; du Boulay 2016). More recently, commercial AI products have also entered the classroom. However, while many assume that Artificial Intelligence in Education (AIED) means students taught by robot teachers, the reality is more prosaic yet still has the potential to be transformative (Holmes et al. 2019). This chapter introduces AIED, an approach that has so far received little mainstream attention, both as a set of technologies and as a field of inquiry. It discusses AIED’s AI foundations, its use of models, its possible future, and the human context. It begins with some brief examples of AIED technologies.

    Evaluating the potential of agent-based modelling to capture consumer grocery retail store choice behaviours

    Evolving consumer behaviours with regard to store and channel choice, shopping frequency, shopping mission and spending heighten the need for robust spatial modelling tools for use within retail analytics. In this paper, we report on collaboration with a major UK grocery retailer to assess the feasibility of modelling consumer store choice behaviours at the level of the individual consumer. We benefit from very rare access to our collaborating retailer’s customer data, which we use to develop a proof-of-concept agent-based model (ABM). Utilising our collaborating retailer’s loyalty card database, we extract key consumer behaviours in relation to shopping frequency, mission, store choice and spending. We build these observed behaviours into our ABM, based on a simplified urban environment, calibrated and validated against observed consumer data. Our ABM is able to capture key spatiotemporal drivers of consumer store choice behaviour at the individual level. Our findings could afford new opportunities for spatial modelling within the retail sector, enabling the complexity of consumer behaviours to be captured and simulated within a novel modelling framework. We reflect on further model development required for use in a commercial context for location-based decision-making.

    Digital methodologies of education governance: Pearson plc and the remediation of methods

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological complex of Pearson Education’s Learning Curve data-bank and its Center for Digital Data, Analytics and Adaptive Learning. This calls for critical attention to the ‘social life’ of its methods in terms of their historical, technical and methodological provenance; their affordances to generate data for circulation within the institutional circuitry of Pearson and to its wider social networks; their capacity to configure research users’ interpretations; and their generativity to produce the knowledge to influence education policy decisions and pedagogic practices. The purpose of the article is to critically survey the digital methods being mobilized by Pearson to generate educational data, and to examine how its methodological complex acts to produce a new data-based knowledge infrastructure for education. The consequence of this shift to data-based forms of digital education governance by Pearson is a challenge to the legitimacy of the social sciences in the theorization and understanding of learning, and its displacement by the authority of the data sciences.

    The Anthropocene, Resilience and Post-Colonial Computation

    What forms of politics come with the contested ideas of the Anthropocene and resilience? Rather than taking these ideas as a given and looking at their political consequences, I will ask what politics enters at their points of construction, where they are understood as being constructed computationally. This allows me to read across from the Anthropocene and resilience to the other forms of computational anticipation that are becoming pervasive at the level of everyday life. As truth claims that depend on algorithms, I will argue that all of these constructions derive their authority from an entanglement of computation and science. Under current conditions, this entanglement brings its own political tendencies, which can be characterised as colonial. To counter this implicit colonialism I will draw on the feminist and post-colonial approaches of standpoint theory. I believe this offers an alternative to the current entanglements of anticipatory computation, and allows us to re-work it into a post-colonial politics of algorithms and atmospheres.

    QuantCrit: education, policy, ‘Big Data’ and principles for a critical race theory of statistics

    Quantitative research enjoys heightened esteem among policy-makers, media and the general public. Whereas qualitative research is frequently dismissed as subjective and impressionistic, statistics are often assumed to be objective and factual. We argue that these distinctions are wholly false; quantitative data is no less socially constructed than any other form of research material. The first part of the paper presents a conceptual critique of the field, with empirical examples that expose and challenge hidden assumptions that frequently encode racist perspectives beneath the façade of supposed quantitative objectivity. The second part of the paper draws on the tenets of Critical Race Theory (CRT) to set out some principles to guide the future use and analysis of quantitative data. These ‘QuantCrit’ ideas concern (1) the centrality of racism as a complex and deeply-rooted aspect of society that is not readily amenable to quantification; (2) numbers are not neutral and should be interrogated for their role in promoting deficit analyses that serve White racial interests; (3) categories are neither ‘natural’ nor given, and so the units and forms of analysis must be critically evaluated; (4) voice and insight are vital: data cannot ‘speak for itself’ and critical analyses should be informed by the experiential knowledge of marginalized groups; (5) statistical analyses have no inherent value but can play a role in struggles for social justice.